2 research outputs found

    Development of reduced-order models for predicting the plastic deformation of metals employing material knowledge systems.

    Metal alloys being explored for structural applications exhibit a complex polycrystalline internal structure that intrinsically spans multiple length scales. Rational design efforts for such alloys therefore require a multiscale modeling framework that adequately incorporates the physics controlling plastic deformation at each length scale when modeling the overall plastic response of the alloy. Establishing such multiscale modeling frameworks requires low-computational-cost, non-iterative approaches capable of accurately localizing the anisotropic plastic response of polycrystalline microstructures. This dissertation addresses these needs by defining suitable extensions to the scale-bridging, data-driven Material Knowledge System (MKS) framework. The extensions detailed in the subsequent chapters enabled the first successful implementation of this framework for predicting the plastic response of polycrystalline microstructures under arbitrary periodic boundary conditions imposed at the macroscale. The case studies presented in this work demonstrate that the localization models developed with the MKS framework are of low computational cost and non-iterative; their predictions, however, are not as accurate as desired. Leveraging the insights obtained from applying this framework to polycrystalline plasticity, this dissertation therefore provides a robust protocol for incorporating deep learning approaches to better predict the local plastic response in polycrystalline RVEs. The final case study establishes that deep learning approaches, such as convolutional neural networks, are the most robust route to accurate localization reduced-order models for the local anisotropic plastic response of polycrystalline microstructures. (Ph.D. dissertation)
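    The non-iterative localization the abstract describes can be illustrated with a minimal sketch. In the MKS literature, the local response field is expressed as a series of calibrated influence kernels applied to the microstructure function; under periodic boundary conditions the leading term reduces to a circular convolution, evaluated cheaply in Fourier space. The kernel values and grid below are hypothetical toy data, not the dissertation's calibrated coefficients, and a simple convolution stands in for the full series expansion:

    ```python
    import numpy as np

    def mks_localize(m, alpha):
        """First-order MKS-style localization on a periodic grid.

        m:     (H, Nx, Ny, Nz) microstructure function (one channel per local state h)
        alpha: (H, Nx, Ny, Nz) influence kernels
        Returns the predicted local response field, p[s] = sum_h (alpha_h * m_h)[s],
        where * is circular convolution, computed via FFT (hence non-iterative).
        """
        m_k = np.fft.fftn(m, axes=(1, 2, 3))
        a_k = np.fft.fftn(alpha, axes=(1, 2, 3))
        p_k = np.sum(a_k * m_k, axis=0)        # sum over local states h
        return np.real(np.fft.ifftn(p_k))

    # Toy two-phase microstructure on an 8^3 periodic grid (hypothetical values).
    rng = np.random.default_rng(0)
    phase = rng.integers(0, 2, size=(8, 8, 8))
    m = np.stack([(phase == h).astype(float) for h in range(2)])

    # Delta kernels at the origin: each voxel responds only to its own phase,
    # so the prediction reduces to a per-phase lookup (1.0 or 2.0 here).
    alpha = np.zeros_like(m)
    alpha[:, 0, 0, 0] = [1.0, 2.0]

    p = mks_localize(m, alpha)                 # equals phase + 1 for these kernels
    ```

    Because the entire prediction is a fixed set of FFTs and elementwise products, its cost is independent of the constitutive complexity that generated the calibration data, which is the source of the framework's speed advantage over iterative crystal-plasticity solvers.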

    Training Data Selection for Accuracy and Transferability of Interatomic Potentials

    Advances in machine learning (ML) techniques have enabled the development of interatomic potentials that promise both the accuracy of first-principles methods and the low cost, linear scaling, and parallel efficiency of empirical potentials. Despite rapid progress in the last few years, ML-based potentials often struggle to achieve transferability, that is, to provide consistent accuracy across configurations that differ significantly from those used to train the model. To truly realize the promise of ML-based interatomic potentials, it is therefore imperative to develop systematic and scalable approaches for generating diverse training sets that ensure broad coverage of the space of atomic environments. This work explores a diverse-by-construction approach that leverages the optimization of the entropy of atomic descriptors to create a very large ($>2\cdot10^{5}$ configurations, $>7\cdot10^{6}$ atomic environments) training set for tungsten in an automated manner, i.e., without any human intervention. This dataset is used to train polynomial as well as multiple neural network potentials with different architectures. For comparison, a corresponding family of potentials was also trained on an expert-curated dataset for tungsten. The models trained on entropy-optimized data exhibited vastly superior transferability compared to the expert-curated models. Furthermore, while the models trained with heavy user input (i.e., domain expertise) yield the lowest errors when tested on similar configurations, out-of-sample predictions are dramatically more robust when the models are trained on a deliberately diverse training set. Herein we demonstrate the development of both accurate and transferable ML potentials using automated and data-driven approaches for generating large and diverse training sets.
    Comment: Main: 13 pages, 7 figures. Supplemental: 17 pages, 16 figures.
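    The diverse-by-construction idea can be sketched with a simple proxy. The paper optimizes the entropy of atomic descriptors directly; the snippet below instead uses greedy farthest-point selection in descriptor space, a common, cheap surrogate for entropy maximization that likewise pushes the selected set toward broad coverage. The descriptor array and selection size are synthetic placeholders, not the paper's tungsten data or its actual entropy objective:

    ```python
    import numpy as np

    def select_diverse(descriptors, k):
        """Greedy farthest-point sampling in descriptor space.

        descriptors: (N, D) array, one row per candidate configuration.
        Returns indices of k configurations chosen to maximize coverage:
        each step adds the candidate farthest from everything already selected.
        """
        chosen = [0]                           # arbitrary seed configuration
        d_min = np.linalg.norm(descriptors - descriptors[0], axis=1)
        while len(chosen) < k:
            i = int(np.argmax(d_min))          # farthest from the current set
            chosen.append(i)
            d_min = np.minimum(
                d_min, np.linalg.norm(descriptors - descriptors[i], axis=1)
            )
        return chosen

    # Synthetic stand-in for per-configuration atomic descriptors.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 8))
    idx = select_diverse(X, 20)
    ```

    The contrast the abstract draws then follows naturally: an expert-curated set concentrates samples where the expert expects the potential to be used (lowest in-distribution error), while a coverage-driven selection spreads samples across descriptor space, which is what buys the reported out-of-sample robustness.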